Optimal error exponents in hidden Markov models order estimation

Authors

  • Elisabeth Gassiat
  • Stéphane Boucheron
Abstract

We consider the estimation of the number of hidden states (the order) of a discrete-time finite-alphabet hidden Markov model (HMM). The estimators we investigate are related to code-based order estimators: penalized maximum-likelihood (ML) estimators and penalized versions of the mixture estimator introduced by Liu and Narayan. We prove strong consistency of these estimators without assuming any a priori upper bound on the order, and with smaller penalties than in previous works. We prove a version of Stein’s lemma for HMM order estimation and derive an upper bound on underestimation exponents. We then prove that this upper bound can be achieved by both the penalized ML estimator and the penalized mixture estimator. The proof of the latter result gets around the elusive nature of the ML in HMMs by resorting to large-deviation techniques for empirical processes. Finally, we prove that for any consistent HMM order estimator, the overestimation exponent is null for most HMMs.
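The penalized-ML order estimator described above selects the candidate order whose maximized log-likelihood, minus a penalty growing with the order and the sample size, is largest. The following is a minimal illustrative sketch of that selection rule, assuming the per-order maximized log-likelihoods have already been computed by some HMM fitting routine; the BIC-style `c * k^2 * log(n)` penalty shape is an assumption reflecting the roughly k^2 parameters of a k-state HMM over a fixed alphabet, not the paper's exact penalty.

```python
import math

def select_order(logliks, n, c=1.0):
    """Penalized-ML order selection.

    logliks: dict mapping candidate order k to the maximized
             log-likelihood of a k-state HMM on the data.
    n:       number of observations.
    c:       penalty constant (illustrative choice).

    Returns the order k maximizing loglik(k) - c * k^2 * log(n).
    The k^2 log(n) growth is a hypothetical BIC-style penalty,
    not the specific penalty analyzed in the paper.
    """
    def pen(k):
        return c * k * k * math.log(n)
    return max(logliks, key=lambda k: logliks[k] - pen(k))

# Hypothetical maximized log-likelihoods for orders 1..4
# on n = 1000 observations (made-up numbers for illustration).
ll = {1: -1450.0, 2: -1320.0, 3: -1312.0, 4: -1310.5}
print(select_order(ll, n=1000))  # → 2
```

With these made-up values the small likelihood gains at orders 3 and 4 do not offset the penalty growth, so order 2 is selected; a consistent estimator of this type avoids overestimation precisely because the penalty dominates such marginal gains.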


Similar articles

Frame-dependent multi-stream reliability indicators for audio-visual speech recognition

We investigate the use of local, frame-dependent reliability indicators of the audio and visual modalities as a means of estimating stream exponents of multi-stream hidden Markov models for audio-visual automatic speech recognition. We consider two such indicators for each modality, defined as functions of the speech-class conditional observation probabilities of appropriate audio- or visual-only ...


Efficient Estimation of Markov Models Where the Transition Density is Unknown

In this paper we consider the estimation of Markov models where the transition density is unknown. The approach we propose is the empirical characteristic function (ECF) estimation procedure with an approximate optimal weight function. The approximate optimal weight function is obtained through an Edgeworth/Gram-Charlier expansion of the logarithmic transition density of the Markov process. Bas...


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate remains an open problem. We introduce probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...


Water Quality Index Estimation Model for Aquaculture System Using Artificial Neural Network

Water quality plays an important role in attaining a sustainable aquaculture system; its cumulative effect can make or mar the entire system. The amount of dissolved oxygen (DO), alongside other parameters such as temperature, pH, alkalinity, and conductivity, is often used to estimate the water quality index (WQI) in aquaculture. There exist different approaches for the estimation of the quality...


Active Inference for Binary Symmetric Hidden Markov Models

We consider the active maximum a posteriori (MAP) inference problem for hidden Markov models (HMMs), where, given an initial MAP estimate of the hidden sequence, we choose to label certain states in the sequence in order to improve the estimation accuracy of the remaining states. We develop an analytical approach to this problem for the case of binary symmetric HMMs, and obtain a closed-form solution that re...



Journal:
  • IEEE Trans. Information Theory

Volume 49, Issue 

Pages  -

Publication year: 2003